Entropy and freezing in Gaussian models

Abstract

A new definition of the freezing phenomenon is given in relation to the behavior of the entropy of Gibbs measures at low temperatures. In particular, for uncorrelated and log-correlated Gaussian models, we show that freezing arises when the corresponding entropy vanishes.
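As an illustrative sketch of the freezing heuristic (a toy computation under our own assumptions, not the paper's construction): in the Random Energy Model with 2^n i.i.d. centered Gaussian energies of variance n, the normalized entropy of the Gibbs measure stays near 1 at high temperature and collapses toward 0 past the critical inverse temperature β_c = √(2 ln 2) ≈ 1.18.

```python
import math
import random

def gibbs_entropy_rem(n, beta, seed=0):
    """Monte-Carlo sketch: normalized Shannon entropy of the Gibbs measure
    for the Random Energy Model with 2**n i.i.d. centered Gaussian
    energies of variance n (illustrative toy, not the paper's setup)."""
    rng = random.Random(seed)
    energies = [rng.gauss(0.0, math.sqrt(n)) for _ in range(2 ** n)]
    # Gibbs weights p_i proportional to exp(beta * E_i); shift by the max
    # exponent for numerical stability before exponentiating.
    m = max(beta * e for e in energies)
    weights = [math.exp(beta * e - m) for e in energies]
    z = sum(weights)
    probs = [w / z for w in weights]
    # Shannon entropy of the Gibbs measure, normalized by log(2**n),
    # so the high-temperature (uniform-like) value is close to 1.
    h = -sum(p * math.log(p) for p in probs)
    return h / (n * math.log(2.0))

# Entropy near 1 in the high-temperature phase, small in the frozen phase.
print(gibbs_entropy_rem(12, 0.2), gibbs_entropy_rem(12, 3.0))
```

The drop of the normalized entropy toward zero as β crosses β_c is the behavior the definition above formalizes.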

Related articles

ADK Entropy and ADK Entropy Rate in Irreducible-Aperiodic Markov Chain and Gaussian Processes

In this paper, the two-parameter ADK entropy, a generalization of Rényi entropy, is considered and some of its properties are investigated. We will see that the ADK entropy for continuous random variables is invariant under a location transformation but not under a scale transformation of the random variable. Furthermore, the joint ADK entropy, conditional ADK entropy, and chain rule of this ent...
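Since this abstract is truncated, the stated location/scale property can be sketched with ordinary Rényi entropy as a stand-in (ADK entropy being a two-parameter generalization of it); the function name and numerical setup below are illustrative assumptions.

```python
import math

def renyi_entropy_gaussian(mu, sigma, alpha, lo=-50.0, hi=50.0, n=100000):
    """Numerical Rényi differential entropy of N(mu, sigma^2):
    h_alpha = (1 / (1 - alpha)) * ln( integral of f(x)**alpha dx ),
    evaluated with a midpoint Riemann sum over [lo, hi]."""
    dx = (hi - lo) / n
    integral = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        f = math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
        integral += f ** alpha * dx
    return math.log(integral) / (1.0 - alpha)

# A location shift leaves the entropy unchanged; a rescaling shifts it
# by the log of the scale factor.
h0 = renyi_entropy_gaussian(0.0, 1.0, 2.0)
h_shift = renyi_entropy_gaussian(5.0, 1.0, 2.0)
h_scale = renyi_entropy_gaussian(0.0, 3.0, 2.0)
print(h0, h_shift, h_scale)
```

For α = 2 and σ = 1 the closed form is h_2 = ½ ln(4π), which the numerical value matches.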

The Rate of Entropy for Gaussian Processes

In this paper, we show that in order to obtain the Tsallis entropy rate for stochastic processes, we can use the limit of conditional entropy, as was done for the Shannon and Rényi entropy rates. Using this, we obtain the Tsallis entropy rate for stationary Gaussian processes. Finally, we derive the relation between the Rényi, Shannon, and Tsallis entropy rates for stationary Gaussian proc...
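The Rényi/Tsallis relation mentioned at the end can be sketched for a single Gaussian marginal (a simplification: entropies of one Gaussian rather than full entropy rates; the closed forms used are standard).

```python
import math

def renyi_gaussian(sigma, q):
    # Closed-form Rényi differential entropy of N(mu, sigma^2), q > 0, q != 1.
    return 0.5 * math.log(2 * math.pi * sigma ** 2) + math.log(q) / (2 * (q - 1))

def tsallis_from_renyi(h_q, q):
    # Standard link: S_q = (1 - exp((1 - q) * h_q)) / (q - 1).
    return (1.0 - math.exp((1.0 - q) * h_q)) / (q - 1.0)

def tsallis_gaussian(sigma, q):
    # Direct closed form: S_q = (1 - I_q) / (q - 1) with
    # I_q = integral of f**q = (2*pi*sigma**2)**((1 - q) / 2) / sqrt(q).
    iq = (2 * math.pi * sigma ** 2) ** ((1 - q) / 2) / math.sqrt(q)
    return (1.0 - iq) / (q - 1.0)

q, sigma = 2.0, 1.5
t_via_renyi = tsallis_from_renyi(renyi_gaussian(sigma, q), q)
t_direct = tsallis_gaussian(sigma, q)
print(t_via_renyi, t_direct)
```

As q → 1 both expressions recover the Shannon entropy ½ ln(2πeσ²), mirroring the limiting relation between the three entropy rates.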

Maximum Entropy and Gaussian Models for Image Object Recognition

The principle of maximum entropy is a powerful framework that can be used to estimate class posterior probabilities for pattern recognition tasks. In this paper, we show how this principle is related to the discriminative training of Gaussian mixture densities using the maximum mutual information criterion. This leads to a relaxation of the constraints on the covariance matrices to be positive ...
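The link this abstract alludes to can be illustrated in miniature: two Gaussian class-conditional densities with shared variance produce a class posterior that is exactly a log-linear (maximum-entropy style) model. A hedged sketch with hypothetical function names, not the paper's training procedure:

```python
import math

def bayes_posterior(x, mu0, mu1, sigma, prior1=0.5):
    """Exact posterior P(c=1 | x) from two Gaussian class-conditional
    densities N(mu0, sigma^2) and N(mu1, sigma^2) with shared variance."""
    def pdf(m):
        return math.exp(-(x - m) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
    p1 = prior1 * pdf(mu1)
    p0 = (1 - prior1) * pdf(mu0)
    return p1 / (p0 + p1)

def loglinear_posterior(x, mu0, mu1, sigma, prior1=0.5):
    """The same posterior written as a log-linear model
    P(c=1 | x) = sigmoid(w * x + b), with w and b derived from the
    Gaussian parameters (quadratic terms cancel with shared variance)."""
    w = (mu1 - mu0) / sigma ** 2
    b = (mu0 ** 2 - mu1 ** 2) / (2 * sigma ** 2) + math.log(prior1 / (1 - prior1))
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

print(bayes_posterior(0.7, -1.0, 1.0, 0.8), loglinear_posterior(0.7, -1.0, 1.0, 0.8))
```

With class-specific covariances the log-odds pick up a quadratic term in x, which is where relaxing the covariance constraints (as the abstract describes) enters.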

A Gaussian Prior for Smoothing Maximum Entropy Models

In certain contexts, maximum entropy (ME) modeling can be viewed as maximum likelihood training for exponential models, and like other maximum likelihood methods it is prone to overfitting of the training data. Several smoothing methods for maximum entropy models have been proposed to address this problem, but previous results do not make it clear how these smoothing methods compare with smoothing meth...
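The smoothing idea can be sketched as follows: a zero-mean Gaussian prior N(0, σ²) on the weights of an exponential model adds an L2 penalty −w²/(2σ²) to the log-likelihood. A toy 1D logistic-regression illustration (learning rate, step count, and data are illustrative assumptions):

```python
import math

def fit_logistic(data, sigma2=None, lr=0.1, steps=2000):
    """Tiny gradient-ascent fit of 1D logistic regression (no bias term).
    Passing sigma2 adds a Gaussian prior N(0, sigma2) on the weight,
    i.e. an L2 penalty -w**2 / (2 * sigma2) on the log-likelihood."""
    w = 0.0
    for _ in range(steps):
        # Gradient of the log-likelihood: sum of (y - sigmoid(w * x)) * x.
        grad = sum((y - 1.0 / (1.0 + math.exp(-w * x))) * x for x, y in data)
        if sigma2 is not None:
            grad -= w / sigma2  # gradient of the Gaussian log-prior
        w += lr * grad / len(data)
    return w

# Linearly separable toy data: the unpenalized ML weight keeps growing,
# while the Gaussian prior keeps it finite (the smoothing effect).
data = [(-2.0, 0), (-1.0, 0), (1.0, 1), (2.0, 1)]
w_ml = fit_logistic(data)
w_map = fit_logistic(data, sigma2=1.0)
print(w_ml, w_map)
```

On separable data the maximum-likelihood weight diverges as training continues, which is the overfitting the Gaussian prior is meant to curb.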

Two Entropy-Based Methods for Learning Unsupervised Gaussian Mixture Models

In this paper we address the problem of estimating the parameters of a Gaussian mixture model. Although the EM (Expectation-Maximization) algorithm yields the maximum-likelihood solution, it requires a careful initialization of the parameters, and the optimal number of kernels in the mixture may be unknown beforehand. We propose a criterion based on the entropy of the pdf (probability density func...
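As background for the abstract above, a minimal EM fit of a 1D Gaussian mixture (the baseline the paper starts from; the entropy-based selection criterion itself is not reproduced, and the initialization and data below are illustrative):

```python
import math
import random

def em_gmm_1d(xs, k=2, iters=100):
    """Minimal EM for a 1D Gaussian mixture with k components.
    Means are initialized deterministically, spread over the data range."""
    lo, hi = min(xs), max(xs)
    mus = [lo + (j + 0.5) * (hi - lo) / k for j in range(k)]
    sigmas = [1.0] * k
    weights = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibilities r[i][j] = P(component j | x_i).
        resp = []
        for x in xs:
            ps = [w * math.exp(-(x - m) ** 2 / (2 * s ** 2)) / (s * math.sqrt(2 * math.pi))
                  for w, m, s in zip(weights, mus, sigmas)]
            tot = sum(ps)
            resp.append([p / tot for p in ps])
        # M-step: re-estimate mixture weights, means, and variances.
        for j in range(k):
            nj = sum(r[j] for r in resp)
            weights[j] = nj / len(xs)
            mus[j] = sum(r[j] * x for r, x in zip(resp, xs)) / nj
            var = sum(r[j] * (x - mus[j]) ** 2 for r, x in zip(resp, xs)) / nj
            sigmas[j] = math.sqrt(max(var, 1e-6))
    return weights, mus, sigmas

# Two well-separated clusters around -5 and +5.
rng = random.Random(1)
xs = [rng.gauss(-5, 1) for _ in range(200)] + [rng.gauss(5, 1) for _ in range(200)]
weights, mus, sigmas = em_gmm_1d(xs)
print(sorted(mus))  # means recovered near -5 and +5
```

The initialization sensitivity the abstract mentions shows up when the initial means land in the same cluster; deterministic spreading over the data range sidesteps that here.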

Journal

Journal title: Journal of Mathematical Physics

Year: 2022

ISSN: 0022-2488, 1527-2427, 1089-7658

DOI: https://doi.org/10.1063/5.0089784